Increasing the Capacity of a Hopfield Network without Sacrificing Functionality

Author

  • Amos J. Storkey
Abstract

Hopfield networks are commonly trained by one of two algorithms. The simplest of these is the Hebb rule, which has a low absolute capacity of n/(2 ln n), where n is the total number of neurons. This capacity can be increased to n by using the pseudo-inverse rule. However, capacity is not the only consideration. It is important for rules to be local (the weight of a synapse depends only on information available to the two neurons it connects), incremental (learning a new pattern can be done knowing only the old weight matrix and not the actual patterns stored) and immediate (the learning process is not a limit process). The Hebbian rule is all of these, but the pseudo-inverse is never incremental, and local only if not immediate. The question addressed by this paper is, 'Can the capacity of the Hebbian rule be increased without losing locality, incrementality or immediacy?' Here a new algorithm is proposed. This algorithm is local, immediate and incremental. In addition it has an absolute capacity significantly higher than that of the Hebbian method: n/√(2 ln n). In this paper the new learning rule is introduced, and a heuristic calculation of the absolute capacity of the learning algorithm is given. Simulations show that this calculation does indeed provide a good measure of the capacity for finite network sizes. Comparisons are made between the Hebb rule and this new learning rule.
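The abstract contrasts the Hebbian update with a new rule that is local, incremental and immediate. The sketch below is a minimal illustration, not taken from the paper itself: it implements the standard Hebbian increment and an incremental update of the form commonly attributed to this paper (the "Storkey rule"); the exact update expression, function names and the small usage example are assumptions for illustration only.

```python
import numpy as np

def hebb_update(W, xi):
    """Hebbian increment for one +/-1 pattern xi; the diagonal is kept at zero."""
    n = xi.size
    W = W + np.outer(xi, xi) / n
    np.fill_diagonal(W, 0.0)
    return W

def storkey_update(W, xi):
    """Incremental, local update of the form commonly attributed to Storkey (1997):
        w_ij <- w_ij + (xi_i*xi_j - xi_i*h_ji - h_ij*xi_j) / n,
    where h_ij = sum over k != i, j of w_ik*xi_k is a local field.
    Only the previous weight matrix and the new pattern are needed.
    """
    n = xi.size
    h = W @ xi  # h[i] = sum_k w_ik * xi_k
    # H[i, j] = h[i] - w_ii*xi_i - w_ij*xi_j  (exclude the two connected neurons)
    H = h[:, None] - np.diag(W)[:, None] * xi[:, None] - W * xi[None, :]
    W = W + (np.outer(xi, xi) - xi[:, None] * H.T - H * xi[None, :]) / n
    np.fill_diagonal(W, 0.0)
    return W

# Tiny usage example: store a few random +/-1 patterns with each rule.
rng = np.random.default_rng(0)
n, m = 100, 10
patterns = rng.choice([-1.0, 1.0], size=(m, n))
W_hebb = np.zeros((n, n))
W_new = np.zeros((n, n))
for p in patterns:
    W_hebb = hebb_update(W_hebb, p)
    W_new = storkey_update(W_new, p)
```

Note that both updates use only the previous weight matrix and the new pattern, which is what the abstract means by "incremental"; the second rule additionally uses the local fields h, which remain available to the two connected neurons, so locality is preserved.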


Similar articles

Computing the Capacity of the Hopfield Neural Network and a Practical Method for Increasing Memory Capacity

The capacity of the Hopfield model has been considered an important parameter in using this model. In this paper, the Hopfield neural network is modeled as a Shannon channel and an upper bound on its capacity is found. For achieving maximum memory, we focus on the training algorithm of the network, and prove that the capacity of the network is bounded by the maximum number of the ortho...


High Capacity Neural Networks for Familiarity Discrimination

This paper presents two new novelty discrimination models for uncorrelated patterns based on neural modelling. The first model uses a single neuron with Hebbian learning and works well when the number of memorised patterns is less than 0.046N (where N is the number of inputs). The second model is based on checking the value of the energy function of a Hopfield network. By sacrificing the ability to ext...


Hopfield Network as Associative Memory with Multiple Reference Points

The Hopfield model of associative memory is studied in this work, in particular two of its main problems: the appearance of spurious patterns in the learning phase, implying the well-known effect of storing the opposite pattern, and the problem of its reduced capacity, meaning that it is not possible to store a great amount of patterns without increasing the error probability in the ret...


On the Maximum Storage Capacity of the Hopfield Model

Recurrent neural networks (RNNs) have traditionally been of great interest for their capacity to store memories. In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the case of the Hopfield network, the most popular kind of RNN. Analyzing the thermodynamic limit of the statistical properties of the Hamiltonian corresponding to the Hopfi...


Estimation of Network Reliability for a Fully Connected Network with Unreliable Nodes and Unreliable Edges using Neuro Optimization

In this paper, an attempt is made to estimate the reliability of a fully connected network of unreliable nodes and unreliable connections (edges) between them. The proliferation of electronic messaging has been witnessed during the last few years. The acute problem of node failure and connection failure is frequently encountered in communication through various types of networks. We know that a ne...



Journal title:

Volume   Issue

Pages  -

Publication date: 1997